[29]:
```python
# This Python 3 environment comes with many helpful analytics libraries installed
# It is defined by the kaggle/python Docker image: https://github.com/kaggle/docker-python

# For example, here's several helpful packages to load
import numpy as np  # linear algebra
import pandas as pd  # data processing, CSV file I/O (e.g. pd.read_csv)

# Input data files are available in the read-only "../input/" directory
# For example, running this (by clicking run or pressing Shift+Enter) will list all files under the input directory
import os
for dirname, _, filenames in os.walk('/kaggle/input'):
    for filename in filenames:
        print(os.path.join(dirname, filename))

# You can write up to 20GB to the current directory (/kaggle/working/) that gets preserved
# as output when you create a version using "Save & Run All"
# You can also write temporary files to /kaggle/temp/, but they won't be saved outside of the current session
```
    /kaggle/input/digit-recognizer/sample_submission.csv
    /kaggle/input/digit-recognizer/train.csv
    /kaggle/input/digit-recognizer/test.csv
[30]:
```python
df = pd.read_csv("/kaggle/input/digit-recognizer/train.csv")
```
[31]:
```python
df.sample()
```
[31]:
|   | label | pixel0 | pixel1 | pixel2 | pixel3 | pixel4 | pixel5 | pixel6 | pixel7 | pixel8 | ... | pixel774 | pixel775 | pixel776 | pixel777 | pixel778 | pixel779 | pixel780 | pixel781 | pixel782 | pixel783 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 19970 | 6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
1 rows × 785 columns
[32]:
```python
df.head()
```
[32]:
|   | label | pixel0 | pixel1 | pixel2 | pixel3 | pixel4 | pixel5 | pixel6 | pixel7 | pixel8 | ... | pixel774 | pixel775 | pixel776 | pixel777 | pixel778 | pixel779 | pixel780 | pixel781 | pixel782 | pixel783 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 2 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 3 | 4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
5 rows × 785 columns
[33]:
```python
import matplotlib.pyplot as plt
```
[34]:
```python
plt.imshow(df.iloc[8525, 1:].values.reshape(28, 28))
```
[34]:
    <matplotlib.image.AxesImage at 0x7a9f67ecee60>
[35]:
```python
X = df.iloc[:, 1:]
y = df.iloc[:, 0]
```
[36]:
```python
from sklearn.model_selection import train_test_split

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
```
[37]:
```python
X_train.shape
```
[37]:
    (33600, 784)
[38]:
```python
from sklearn.neighbors import KNeighborsClassifier
```
[39]:
```python
knn = KNeighborsClassifier()
```
[40]:
```python
knn.fit(X_train, y_train)
```
[40]:
    KNeighborsClassifier()
[41]:
```python
y_pred = knn.predict(X_test)
```
[42]:
```python
from sklearn.metrics import accuracy_score

accuracy_score(y_test, y_pred)
```
[42]:
    0.9648809523809524
[43]:
```python
from sklearn.preprocessing import StandardScaler

scaler = StandardScaler()
```
[44]:
```python
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)
```
[ ]:
```python
# PCA
from sklearn.decomposition import PCA

pca = PCA(n_components=100)
```
[ ]:
```python
X_train_trf = pca.fit_transform(X_train)
X_test_trf = pca.transform(X_test)
```
[47]:
```python
X_train_trf.shape
```
[47]:
    (33600, 100)
[ ]:
```python
knn = KNeighborsClassifier()
```
[ ]:
```python
knn.fit(X_train_trf, y_train)
```
[49]:
    KNeighborsClassifier()
[ ]:
```python
y_pred = knn.predict(X_test_trf)
```
[ ]:
```python
accuracy_score(y_test, y_pred)
```
[51]:
    0.9544047619047619
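For reference, what `fit_transform` does can be reproduced with plain numpy: center the data, take an SVD, and project onto the top right singular vectors. A minimal sketch on synthetic data (the variable names here are illustrative, not the notebook's):

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 50))
k = 10

# Center, then factor: X_centered = U @ diag(S) @ Vt
X_centered = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)

# Project onto the top-k right singular vectors (the principal components)
X_reduced = X_centered @ Vt[:k].T

print(X_reduced.shape)  # → (200, 10)
```

The columns of `X_reduced` come out in order of decreasing variance, which is why truncating to the first k keeps the most informative directions.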
[ ]:
```python
for i in range(1, 785):
    pca = PCA(n_components=i)
    X_train_trf = pca.fit_transform(X_train)
    X_test_trf = pca.transform(X_test)
    knn = KNeighborsClassifier()
    knn.fit(X_train_trf, y_train)
    y_pred = knn.predict(X_test_trf)
    print(accuracy_score(y_test, y_pred))
```
    0.25761904761904764
    0.32404761904761903
    0.5103571428571428
    ...
(output truncated: accuracy climbs steeply over the first ~15 components, peaks around 0.955 near 40–70 components, then drifts slowly down to ~0.939 at 784 components)
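Fitting a fresh KNN for all 784 component counts is very slow; a cheaper heuristic is to pick the smallest k whose cumulative explained-variance ratio clears a chosen threshold. A sketch with made-up ratios (in the notebook these would come from `pca.explained_variance_ratio_`):

```python
import numpy as np

# Hypothetical explained-variance ratios (sorted descending, summing to 1)
ratios = np.array([0.40, 0.20, 0.12, 0.08, 0.06, 0.05, 0.04, 0.03, 0.015, 0.005])

cumulative = np.cumsum(ratios)
# np.argmax returns the index of the first True, i.e. the first crossing
k = int(np.argmax(cumulative >= 0.90)) + 1
print(k)  # → 6
```

This gives one KNN fit at the chosen k instead of 784 fits, at the cost of trusting variance retained as a proxy for classification accuracy.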
[ ]:
```python
pca = PCA(n_components=2)
X_train_trf = pca.fit_transform(X_train)
X_test_trf = pca.transform(X_test)
```
[54]:
```python
X_train_trf
```
[54]:
    array([[-2.71864483, -0.48981016],
           [-0.67698414, -6.75405987],
           [-3.03320724,  6.50918599],
           ...,
           [ 2.14883779,  0.78075924],
           [ 1.05955416,  0.94782655],
           [17.70259619,  1.96190349]])
[55]:
```python
import plotly.express as px

y_train_trf = y_train.astype(str)
fig = px.scatter(x=X_train_trf[:, 0], y=X_train_trf[:, 1],
                 color=y_train_trf,
                 color_discrete_sequence=px.colors.qualitative.G10)
fig.show()
```
[ ]:
```python
pca = PCA(n_components=3)
X_train_trf = pca.fit_transform(X_train)
X_test_trf = pca.transform(X_test)
```
[57]:
```python
y_train_trf = y_train.astype(str)
fig = px.scatter_3d(df, x=X_train_trf[:, 0], y=X_train_trf[:, 1], z=X_train_trf[:, 2],
                    color=y_train_trf)
fig.update_layout(margin=dict(l=20, r=20, t=20, b=20))
fig.show()
```
[ ]:
```python
pca.explained_variance_  # eigenvalues
```
[58]:
    array([40.67111198, 29.17023399, 26.74459561])
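`pca.explained_variance_` holds the eigenvalues of the covariance matrix, which equal the variances of the data along each principal axis. A small numpy check of that identity on synthetic data (not the MNIST matrix):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5)) @ rng.normal(size=(5, 5))  # correlated features
Xc = X - X.mean(axis=0)

# Eigenvalues of the sample covariance matrix, largest first
eigvals = np.sort(np.linalg.eigvalsh(np.cov(Xc, rowvar=False)))[::-1]

# Variances along the principal axes, recovered from the singular values
_, S, _ = np.linalg.svd(Xc, full_matrices=False)
proj_var = S**2 / (len(Xc) - 1)

print(np.allclose(eigvals, proj_var))  # → True
```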
[ ]:
```python
pca.components_.shape  # eigenvectors
```
[60]:
    (3, 784)
[ ]:
```python
pca.explained_variance_ratio_
```
[61]:
    array([0.05785192, 0.0414927 , 0.03804239])
[ ]:
```python
pca = PCA(n_components=None)
X_train_trf = pca.fit_transform(X_train)
X_test_trf = pca.transform(X_test)
```
[ ]:
```python
pca.explained_variance_.shape
```
[63]:
    (784,)
[ ]:
```python
pca.components_.shape
```
[64]:
    (784, 784)
[ ]:
```python
np.cumsum(pca.explained_variance_ratio_)
```
[65]:
    array([0.05785192, 0.09934462, 0.13738701, 0.16704964, 0.19286525,
           0.21541506, 0.23514574, 0.25289854, 0.26858504, 0.28294568,
           ...
           1.        , 1.        , 1.        , 1.        ])
(output truncated: the cumulative ratio crosses 0.90 near 225 components and 0.95 near 315, saturating at 1.0 around 700 components)
[ ]:
```python
plt.plot(np.cumsum(pca.explained_variance_ratio_))
```
[66]:
    [<matplotlib.lines.Line2D at 0x7a9f641185e0>]
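Instead of eyeballing the elbow in this curve, scikit-learn accepts a float between 0 and 1 as `n_components`, and PCA keeps the smallest number of components whose cumulative ratio reaches that fraction. A sketch on low-rank synthetic data:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 10)) @ rng.normal(size=(10, 40))  # rank-10 data in 40 dims

pca = PCA(n_components=0.95)  # keep 95% of the variance
X_reduced = pca.fit_transform(X)

print(X_reduced.shape[1], pca.explained_variance_ratio_.sum())
```

Because the data has rank 10, at most 10 components are needed; on the scaled MNIST matrix the same call would land near the crossing point read off the cumulative-sum output above.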